Lecture 8: Shannon's Noise Models

Authors

  • Sandipan Kundu
  • Atri Rudra
Abstract

In the figure above, source coding and channel coding are coupled. However, Shannon's source coding theorem allows us to decouple these two parts of the communication and study each of them separately. Intuitively, this makes sense: if one can have reliable communication over the channel using channel coding, then for the purposes of source coding the channel effectively has no noise. For source coding, Shannon proved a theorem that precisely calculates the amount by which the message can be compressed: this amount is related to the "entropy" of the message. We will, however, not talk about source coding in any detail in this course. From now on, we will focus exclusively on the channel coding part of the communication setup. Note that one aspect of channel coding is how we model the channel noise. We have seen Hamming's worst-case noise model in some detail. Next, we will study some specific stochastic channels.
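
To make the entropy bound concrete, here is a minimal Python sketch (the function name and example strings are our own illustrations, not from the lecture) that computes the empirical entropy of a message; by Shannon's source coding theorem, this is the best achievable compression rate, in bits per symbol, for a memoryless source with these symbol frequencies.

    from collections import Counter
    from math import log2

    def empirical_entropy(message: str) -> float:
        # Empirical (first-order) entropy in bits per symbol: the source
        # coding limit for a memoryless source with these frequencies.
        counts = Counter(message)
        n = len(message)
        return -sum((c / n) * log2(c / n) for c in counts.values())

    print(empirical_entropy("aaaaaaab"))  # ~0.544: skewed, very compressible
    print(empirical_entropy("abcdefgh"))  # 3.0: uniform over 8 symbols, incompressible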


Similar resources

Lecture 3: Shannon's theorem and some upper and lower bounds

1. Noisy Channel: This channel introduces noise into the data during transmission, so the receiver gets a noisy version of the transmitted data. Therefore, for reliable communication, redundancy has to be added to the data generated by the source. This redundancy is then removed at the receiver's end. This process of introducing redundancy in the data to make it more resilient to noise is...
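
As a toy illustration of adding and then removing redundancy, the following Python sketch (our construction; the 3-fold repetition code and the flip probability p = 0.1 are arbitrary choices, not from the lecture) sends a message through a simulated binary symmetric channel:

    import random

    rng = random.Random(0)  # fixed seed so the example is reproducible

    def bsc(bits, p):
        # Binary symmetric channel: flip each bit independently with probability p.
        return [b ^ (rng.random() < p) for b in bits]

    def encode(bits, r=3):
        # Add redundancy: repeat each data bit r times.
        return [b for b in bits for _ in range(r)]

    def decode(bits, r=3):
        # Remove redundancy: majority vote over each length-r block.
        return [int(sum(bits[i:i + r]) > r // 2) for i in range(0, len(bits), r)]

    msg = [1, 0, 1, 1, 0, 0, 1, 0]
    received = bsc(encode(msg), p=0.1)
    print(decode(received) == msg)  # True unless 2 of the 3 copies of some bit flip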


Lecture 3: Shannon's Theorem, October 9, 2006

The communication model we are using consists of a source that generates digital information. This information is sent to a destination through a channel. The communication can happen in the spatial domain (i.e., we need to send information over a physical distance on a channel) or in the time domain (i.e., we want to retrieve data that we stored at an earlier point in time). The channel can be...


Lecture 11: Shannon vs. Hamming, September 21, 2007

In the last lecture, we proved the positive part of Shannon's capacity theorem for the BSC. We showed, by the probabilistic method, that there exist an encoding function E and a decoding function D such that

    \[
    \mathbb{E}_m\Bigl[\Pr_{\text{noise } e \text{ of } \mathrm{BSC}_p}\bigl[D(E(m)+e)\neq m\bigr]\Bigr] \le 2^{-\delta' n}. \tag{1}
    \]

In other words, the average decoding error probability is small. However, we need to show that the maximum decoding error probability...
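
A standard way to pass from small average error to small maximum error is an expurgation argument; the following Markov-style step is our sketch, not necessarily the exact argument of the lecture. Writing $p_m = \Pr_e[D(E(m)+e) \neq m]$, inequality (1) bounds the average of the $p_m$ by $2^{-\delta' n}$, hence

    \[
    \bigl|\{\, m : p_m > 2 \cdot 2^{-\delta' n} \,\}\bigr| < \frac{|\mathcal{M}|}{2},
    \]

since otherwise those bad messages alone would push the average strictly above $2^{-\delta' n}$. Discarding this worse half leaves at least $|\mathcal{M}|/2$ messages (a rate loss of only $1/n$) whose maximum decoding error probability is at most $2 \cdot 2^{-\delta' n}$.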


Lecture 21: Protecting against information loss: coding theory

Computer and information systems are prone to data loss (lost packets, crashed or corrupted hard drives, noisy transmissions, etc.), and it is important to prevent actual loss of important information when this happens. Today's lecture concerns error correcting codes, a stepping stone to many other ideas, including a big research area (usually based in EE departments) called information theory. Th...


Publication date: 2010